The Noisy Expectation-Maximization Algorithm for Multiplicative Noise Injection

Authors

  • Osonde Osoba
  • Bart Kosko
Abstract

We generalize the noisy expectation-maximization (NEM) algorithm to allow arbitrary modes of noise injection besides just adding noise to the data. The noise must still satisfy a NEM positivity condition. This generalization includes the important special case of multiplicative noise injection. A generalized NEM theorem shows that all measurable modes of injecting noise will speed the average convergence of the EM algorithm if the noise satisfies a generalized NEM positivity condition. This noise-benefit condition has a simple quadratic form for Gaussian and Cauchy mixture models in the case of multiplicative noise injection. Simulations show a multiplicative-noise EM speed-up of more than 27% in a Gaussian mixture model. Injecting blind noise only slowed convergence. A related theorem gives a sufficient condition for an average EM noise benefit for arbitrary modes of noise injection if the data model comes from the general exponential family of probability density functions. A final theorem shows that injected noise slows EM convergence on average if the NEM inequalities reverse and the noise satisfies a negativity condition.
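The multiplicative-noise EM procedure described above can be illustrated with a minimal sketch for a two-component 1-D Gaussian mixture model. This is an assumption-laden illustration, not the paper's implementation: the noise scale and annealing rate are arbitrary, and the sketch injects annealed multiplicative noise without screening samples against the NEM positivity condition that the paper requires for a guaranteed average speed-up.

```python
# Illustrative sketch (not the paper's code): EM for a 1-D two-component
# Gaussian mixture, with optional multiplicative noise injected into the
# data each iteration. Noise scale and decay rate are assumed values.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a two-component Gaussian mixture.
n = 500
z = rng.random(n) < 0.6
y = np.where(z, rng.normal(-2.0, 1.0, n), rng.normal(2.0, 1.0, n))

def em_gmm(y, noise_scale=0.0, decay=0.9, iters=50):
    """EM for a 2-component GMM; optional multiplicative noise on the data.

    Noise enters multiplicatively as y * (1 + a*eps), eps ~ N(0, 1),
    with the scale annealed toward zero (a_k = noise_scale * decay**k)
    so the noisy iteration shares the fixed points of plain EM.
    """
    w, mu, sig = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
    for k in range(iters):
        a = noise_scale * decay ** k
        yk = y * (1.0 + a * rng.standard_normal(y.shape))  # multiplicative noise
        # E-step: posterior responsibility of component 0 (shared
        # Gaussian normalizing constants cancel in the ratio).
        p0 = w * np.exp(-0.5 * ((yk - mu[0]) / sig[0]) ** 2) / sig[0]
        p1 = (1 - w) * np.exp(-0.5 * ((yk - mu[1]) / sig[1]) ** 2) / sig[1]
        r = p0 / (p0 + p1)
        # M-step: update mixture weight, means, and standard deviations.
        w = r.mean()
        mu = np.array([np.sum(r * yk) / np.sum(r),
                       np.sum((1 - r) * yk) / np.sum(1 - r)])
        sig = np.sqrt(np.array([
            np.sum(r * (yk - mu[0]) ** 2) / np.sum(r),
            np.sum((1 - r) * (yk - mu[1]) ** 2) / np.sum(1 - r),
        ]))
    return w, mu, sig

w_plain, mu_plain, _ = em_gmm(y)                    # plain EM
w_nem, mu_nem, _ = em_gmm(y, noise_scale=0.1)       # multiplicative-noise EM
print(mu_plain, mu_nem)
```

Both runs recover means near -2 and 2; measuring the claimed convergence speed-up would additionally require tracking the log-likelihood per iteration and enforcing the generalized NEM positivity condition on the injected noise.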


Similar Articles

Noisy Expectation-Maximization: Applications and Generalizations

We present a noise-injected version of the Expectation-Maximization (EM) algorithm: the Noisy Expectation Maximization (NEM) algorithm. The NEM algorithm uses noise to speed up the convergence of the EM algorithm. The NEM theorem shows that injected noise speeds up the average convergence of the EM algorithm to a local maximum of the likelihood surface if a positivity condition holds. The gener...


Adaptation method based on HMM composition and EM algorithm

A method is proposed for adapting HMMs to additive noise and multiplicative distortion at the same time. This method first creates a noise HMM for additive noise, then composes HMMs for noisy and distorted speech data from this HMM and speech HMMs, so that these composed HMMs become functions of the signal-to-noise (S/N) ratio and the multiplicative distortion. S/N ratio and multiplicative distortio...


Subset Selection under Noise

The problem of selecting the best k-element subset from a universe arises in many applications. While previous studies assumed a noise-free environment or a noisy monotone submodular objective function, this paper considers a more realistic and general situation where the evaluation of a subset is a noisy monotone function (not necessarily submodular), with both multiplicative and additive...


Estimating noise from noisy speech features with a monte carlo variant of the expectation maximization algorithm

In this work, we derive a Monte Carlo expectation maximization algorithm for estimating noise from a noisy utterance. In contrast to earlier approaches, where the distribution of noise was estimated based on a vector Taylor series expansion, we use a combination of importance sampling and Parzen-window density estimation to numerically approximate the occurring integrals with the Monte Carlo me...


The Noisy Expectation-Maximization Algorithm

We present a noise-injected version of the Expectation-Maximization (EM) algorithm: the Noisy Expectation Maximization (NEM) algorithm. The NEM algorithm uses noise to speed up the convergence of the EM algorithm. The NEM theorem shows that additive noise speeds up the average convergence of the EM algorithm to a local maximum of the likelihood surface if a positivity condition holds. Corollary...



Publication date: 2015